
Conversation


@IMbackK commented Aug 13, 2025

Unfortunately, due to issues in amd_hip_bf16.h, we can no longer support ROCm < 6.1.

I am not happy about this, but ROCm 6.1 is at least what is currently in Debian stable, so I believe we still cover all reasonable ROCm installations. There were no changes to hardware support between ROCm 5.5 and ROCm 6.1, so no hardware is left behind either.

github-actions bot added labels Aug 13, 2025: Nvidia GPU (issues specific to Nvidia GPUs), devops (improvements to build systems and github actions), ggml (changes relating to the ggml tensor library for machine learning)
@IMbackK IMbackK merged commit 29c8fbe into ggml-org:master Aug 13, 2025
47 checks passed
the-phobos pushed a commit to the-phobos/llama.cpp that referenced this pull request Aug 14, 2025
CISC commented Aug 15, 2025

@IMbackK Not related to this, but I've seen a couple of times now that the windows-latest-cmake-hip build gets stuck on the Install step for hours, any idea?

https://github.com/ggml-org/llama.cpp/actions/runs/16989547440/job/48165519619

Nexesenex added a commit to Nexesenex/croco.cpp that referenced this pull request Oct 6, 2025